Second-order refined peaks-over-threshold modelling for heavy-tailed distributions
Modelling excesses over a high threshold using the Pareto or generalized
Pareto distribution (PD/GPD) is the most popular approach in extreme value
statistics. This method typically requires a high threshold for the (G)PD to fit well, and it then applies only to a small upper fraction of the data. The extension of the (G)PD proposed in this paper can describe the excess distribution at lower thresholds in the case of heavy-tailed distributions. This yields a statistical model that can be fitted to a larger
portion of the data. Moreover, estimates of tail parameters display stability
for a larger range of thresholds. Our findings are supported by asymptotic
results, simulations and a case study. Comment: to appear in the Journal of Statistical Planning and Inference.
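The classical peaks-over-threshold fit that this paper refines can be sketched with scipy; the simulated data, the 95% threshold choice, and all parameter values below are illustrative assumptions, and the second-order extension itself is not shown.

```python
# Standard POT: fit a generalized Pareto distribution (GPD) to excesses
# over a high threshold. Sample, threshold, and parameters are illustrative.
import numpy as np
from scipy.stats import genpareto, pareto

rng = np.random.default_rng(42)
# Simulate a heavy-tailed sample (Pareto with tail index 2, i.e. xi = 0.5).
data = pareto.rvs(b=2, size=5000, random_state=rng)

# Choose a high threshold, e.g. the 95% empirical quantile.
u = np.quantile(data, 0.95)
excesses = data[data > u] - u

# Fit the GPD to the excesses (location fixed at 0).
xi, loc, beta = genpareto.fit(excesses, floc=0)
print(f"threshold u = {u:.3f}, shape xi = {xi:.3f}, scale beta = {beta:.3f}")
```

Only the roughly 5% of observations above the threshold enter the fit, which is exactly the limitation the proposed extension addresses by allowing lower thresholds.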
Modelling Censored Losses Using Splicing: a Global Fit Strategy With Mixed Erlang and Extreme Value Distributions
In risk analysis, a global fit that appropriately captures the body and the
tail of the distribution of losses is essential. Modelling the whole range of
the losses using a standard distribution is usually very hard and often
impossible due to the specific characteristics of the body and the tail of the
loss distribution. A possible solution is to combine two distributions in a
splicing model: a light-tailed distribution for the body which covers light and
moderate losses, and a heavy-tailed distribution for the tail to capture large
losses. We propose a splicing model with a mixed Erlang (ME) distribution for
the body and a Pareto distribution for the tail. This combines the flexibility
of the ME distribution with the ability of the Pareto distribution to model
extreme values. We extend our splicing approach to censored and/or truncated data. Relevant examples of such data can be found in financial risk analysis.
We illustrate the flexibility of this splicing model using practical examples
from risk measurement.
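A minimal sketch of the splicing idea, assuming a single Erlang (gamma) component in place of the full mixed Erlang body; the splicing point, weight and all parameters are illustrative assumptions.

```python
# Spliced density: renormalised Erlang body on [0, t] with weight p,
# Pareto tail above t with weight 1 - p. All values are illustrative.
import numpy as np
from scipy.integrate import quad
from scipy.stats import gamma, pareto

t = 10.0   # splicing point between body and tail
p = 0.9    # probability mass assigned to the body
body = gamma(a=3, scale=2)       # single Erlang component (integer shape)
tail = pareto(b=1.5, scale=t)    # Pareto tail starting at t

def spliced_pdf(x):
    x = np.asarray(x, dtype=float)
    below = p * body.pdf(x) / body.cdf(t)   # body renormalised to [0, t]
    above = (1 - p) * tail.pdf(x)           # tail carries mass 1 - p
    return np.where(x <= t, below, above)

# The density integrates to one by construction (split at t).
total = quad(spliced_pdf, 0, t)[0] + quad(spliced_pdf, t, np.inf)[0]
print(f"integral of spliced density: {total:.4f}")
```

The body captures light and moderate losses while the Pareto component governs the tail, mirroring the global-fit strategy described above.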
Estimating the maximum possible earthquake magnitude using extreme value methodology: the Groningen case
The area-characteristic maximum possible earthquake magnitude is
required by the earthquake engineering community, disaster management agencies
and the insurance industry. The Gutenberg-Richter law predicts that earthquake
magnitudes follow a truncated exponential distribution. In the geophysical
literature several estimation procedures were proposed, see for instance Kijko
and Singh (Acta Geophys., 2011) and the references therein. Estimation of this maximum magnitude is of course an extreme value problem to which the classical methods for
endpoint estimation could be applied. We argue that recent methods on truncated
tails at high levels (Beirlant et al., Extremes, 2016; Electron. J. Stat.,
2017) constitute a more appropriate setting for this estimation problem. We
present upper confidence bounds to quantify uncertainty of the point estimates.
We also compare methods from the extreme value and geophysical literature
through simulations. Finally, the different methods are applied to the
magnitude data for the earthquakes induced by gas extraction in the Groningen
province of the Netherlands.
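The role of the endpoint in the truncated exponential (Gutenberg-Richter) model can be illustrated numerically. The spacing-based correction below is a generic textbook endpoint estimator, not one of the methods compared in the paper, and all parameter values are assumptions.

```python
# Magnitudes follow a truncated exponential on [m0, Mmax] under the
# Gutenberg-Richter law. The sample maximum is a downward-biased endpoint
# estimate; a simple spacing-based correction is shown for contrast.
import numpy as np

rng = np.random.default_rng(0)
m0, Mmax, beta = 1.5, 4.5, 2.0   # lower cutoff, true endpoint, GR rate

# Inverse-CDF sampling from the truncated exponential on [m0, Mmax].
u = rng.uniform(size=2000)
c = 1 - np.exp(-beta * (Mmax - m0))
mags = m0 - np.log(1 - c * u) / beta

x = np.sort(mags)
naive = x[-1]                       # sample maximum, always below Mmax
spacing = x[-1] + (x[-1] - x[-2])   # spacing-based endpoint correction
print(f"true Mmax = {Mmax}, sample max = {naive:.3f}, corrected = {spacing:.3f}")
```

The downward bias of the sample maximum is what motivates dedicated endpoint and truncated-tail estimators in this setting.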
Quasi-Likelihood Estimation of Benchmark Rates for Excess of Loss Reinsurance Programs
In this paper a method for determining benchmark rates for the excess of loss reinsurance of a Motor Third Party Liability insurance portfolio is developed based on observed market rates. The benchmark rates are expressed as a percentage of the expected premium income that is available to cover the whole risk of the portfolio. The rates are assumed to derive from a compound process with a heavy-tailed severity, such as a Burr or Pareto distribution. In the absence of claim data these assumptions determine the theoretical benchmark rate component of the regression model. Given the whole set of excess of loss reinsurance rates in a given market, the unknown parameters are estimated within the framework of quasi-likelihood estimation. This framework makes it possible to select a theoretical benchmark rate model and to choose a parsimonious submodel for describing the observed market rates over a four-year observation period. The method is applied to the Belgian Motor Third Party Liability excess of loss rates observed during the years 2001 till 200
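The kind of theoretical rate component such a regression builds on can be illustrated by the closed-form pure premium of losses in excess of a retention under a Pareto severity; the parameter values are assumptions and this is not the paper's full compound model.

```python
# Pure premium of an excess-of-loss cover above retention R for a Pareto
# severity with tail index alpha > 1 and scale sigma:
#   E[(X - R)_+] = sigma^alpha * R^(1 - alpha) / (alpha - 1)  for R >= sigma.
# alpha, sigma, R are illustrative; a Monte Carlo check confirms the formula.
import numpy as np
from scipy.stats import pareto

alpha, sigma, R = 2.5, 1.0, 5.0   # tail index, scale, retention

closed = sigma**alpha * R**(1 - alpha) / (alpha - 1)

rng = np.random.default_rng(1)
x = pareto.rvs(b=alpha, scale=sigma, size=1_000_000, random_state=rng)
mc = np.maximum(x - R, 0).mean()
print(f"closed form: {closed:.5f}, Monte Carlo: {mc:.5f}")
```

Expressing such a premium relative to the expected total premium income gives a theoretical benchmark rate of the form the quasi-likelihood regression fits to market rates.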